Pre-trained Models

A pre-trained model is a model, or saved network, that has already been trained by someone else on a large dataset to solve a problem similar to the one at hand. AI teams can use a pre-trained model as a starting point instead of building a model from scratch. Examples of successful large-scale pre-trained language models include Bidirectional Encoder Representations from Transformers (BERT) and the Generative Pre-trained Transformer (GPT-n) series.
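
As a minimal sketch of the idea, assuming the Hugging Face `transformers` library (not mentioned in the original text) is installed, a pre-trained BERT checkpoint can be loaded and reused as a starting point rather than training a network from scratch:

```python
# Minimal sketch: load a pre-trained BERT model and reuse its representations.
# Assumes the Hugging Face `transformers` library and PyTorch are installed.
from transformers import AutoTokenizer, AutoModel

# Download (or load from cache) the pre-trained checkpoint trained by someone else.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Encode a sentence with the pre-trained model.
inputs = tokenizer("Pre-trained models save training time.", return_tensors="pt")
outputs = model(**inputs)

# The resulting hidden states can serve as features for a downstream task
# (e.g. classification), instead of learning representations from scratch.
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```

In practice, teams typically attach a small task-specific head on top of such a checkpoint and fine-tune it on their own data, which usually needs far less data and compute than full training.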
